At its recent WWDC 2023 event, Apple introduced iOS 17. While making sharing easier, iOS 17 also aims to protect users: the company wants to safeguard iPhone owners against sensitive content.
iOS 17 will protect users from unwanted explicit visuals.
Apple announced that the new operating system will include a Sensitive Content Warning feature, which helps users avoid unwanted explicit photos and videos. With iOS 17, users will be able to decline sensitive content before ever viewing it.
The feature works not only in the Messages app but also in AirDrop, contact posters, FaceTime messages, and the Photos picker. iOS 17 uses on-device machine learning to detect and blur explicit material within these applications, and the system covers videos as well as photos.
In addition to this feature, the US technology giant will also protect child users. If underage users encounter such content, they will be able to message a trusted adult for help.
Apple emphasizes that because detection happens entirely on-device, the company itself has no access to the content. To use the child-protection feature, users need to enable Family Sharing and designate specific accounts as belonging to children.
In recent years, several governments have begun to treat sending unsolicited explicit images as a criminal offense, and Apple's initiative fits this context. Although the Communication Safety and Sensitive Content Warning systems cannot completely prevent explicit content, they can at least shield users from potentially traumatic visuals.
With these features in iOS 17, Apple aims to protect all of its users from unwanted explicit visuals. What are your thoughts on iOS 17 and its new features? Feel free to share your opinions in the comments section.